35 research outputs found

    Effects of Amount of Information on Overconfidence

    Get PDF
    Title: Effects of amount of information on overconfidence
    Authors: Tsai, Claire; Klayman, Joshua; Hastie, Reid
    Affiliation: The University of Chicago
    Abstract: When a person makes a judgment based on evidence and assesses confidence in that judgment, what is the effect of providing more judgment-relevant information? Findings by Oskamp (1965) and by Slovic and Corrigan (1977) suggest that more information leads to increasing overconfidence. We replicate the finding that receiving more information leads judges to increase their confidence even when their predictive accuracy does not improve. We identify some likely candidates for cues people use to judge confidence that do not correlate well with actual accuracy.

    Reputation Agent: Prompting Fair Reviews in Gig Markets

    Full text link
    Our study presents a new tool, Reputation Agent, to promote fairer reviews from requesters (employers or customers) on gig markets. Unfair reviews, created when requesters consider factors outside of a worker's control, are known to plague gig workers and can result in lost job opportunities and even termination from the marketplace. Our tool leverages machine learning to implement an intelligent interface that: (1) uses deep learning to automatically detect when an individual has included unfair factors in her review (factors outside the worker's control per the policies of the market); and (2) prompts the individual to reconsider her review if she has incorporated unfair factors. To study the effectiveness of Reputation Agent, we conducted a controlled experiment over different gig markets. Our experiment illustrates that across markets, Reputation Agent, in contrast with traditional approaches, motivates requesters to review gig workers' performance more fairly. We discuss how tools that bring more transparency to employers about the policies of a gig market can help build empathy, thus resulting in reasoned discussions around potential injustices towards workers generated by these interfaces. Our vision is that with tools that promote truth and transparency we can bring fairer treatment to gig workers.
    Comment: 12 pages, 5 figures, The Web Conference 2020, ACM WWW 2020
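    The detect-then-prompt flow described in the abstract can be pictured with a minimal sketch. This is not the authors' implementation: the paper uses a deep-learning classifier, whereas a trivial keyword heuristic stands in for it here, and all names (UNFAIR_FACTOR_TERMS, detect_unfair_factors, review_gate) are hypothetical.

```python
# Minimal sketch of a detect-then-prompt review flow (illustrative only).
# A keyword heuristic stands in for the paper's deep-learning detector.

UNFAIR_FACTOR_TERMS = {
    "traffic": "delivery delays caused by traffic are outside the worker's control",
    "weather": "weather conditions are outside the worker's control",
    "app crash": "platform or app failures are outside the worker's control",
}

def detect_unfair_factors(review_text: str) -> list[str]:
    """Return explanations for any unfair factors mentioned in the review."""
    text = review_text.lower()
    return [reason for term, reason in UNFAIR_FACTOR_TERMS.items() if term in text]

def review_gate(review_text: str) -> str:
    """Accept the review, or prompt the requester to reconsider unfair factors."""
    hits = detect_unfair_factors(review_text)
    if not hits:
        return "Review submitted."
    bullets = "\n".join(f"- {h}" for h in hits)
    return ("Before submitting, please reconsider: your review mentions factors "
            f"the marketplace treats as outside the worker's control:\n{bullets}")

if __name__ == "__main__":
    print(review_gate("Driver was late because of heavy traffic, 2 stars."))
```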

    Confirmation, Disconfirmation, and Information in Hypothesis Testing

    No full text
    Strategies for hypothesis testing in scientific investigation and everyday reasoning have interested both psychologists and philosophers. A number of these scholars stress the importance of disconfirmation in reasoning and suggest that people are instead prone to a general deleterious "confirmation bias." In particular, it is suggested that people tend to test those cases that have the best chance of verifying current beliefs rather than those that have the best chance of falsifying them. We show, however, that many phenomena labeled "confirmation bias" are better understood in terms of a general positive test strategy. With this strategy, there is a tendency to test cases that are expected (or known) to have the property of interest rather than those expected (or known) to lack that property. This strategy is not equivalent to confirmation bias in the first sense; we show that the positive test strategy can be a very good heuristic for determining the truth or falsity of a hypothesis under realistic conditions. It can, however, lead to systematic errors or inefficiencies. The appropriateness of human hypothesis-testing strategies and prescriptions about optimal strategies must be understood in terms of the interaction between the strategy and the task at hand.
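    A small sketch, using the classic 2-4-6 rule-discovery task, can illustrate why the positive test strategy confirms without falsifying when the hypothesized rule is narrower than the true rule. The rules and numbers below are illustrative assumptions, not material from the paper.

```python
# Illustrative: positive test strategy in a 2-4-6-style rule-discovery task.

def true_rule(triple):
    """The experimenter's rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    """The judge's (too narrow) hypothesis: even numbers ascending by 2."""
    a, b, c = triple
    return a % 2 == 0 and b == a + 2 and c == b + 2

positive_tests = [(2, 4, 6), (10, 12, 14), (20, 22, 24)]  # cases expected to fit the hypothesis
negative_test = (1, 3, 8)                                  # a case that violates the hypothesis

# Every positive test also fits the true rule, so it "confirms" without ever falsifying.
print([true_rule(t) for t in positive_tests])              # [True, True, True]

# The negative test fits the true rule but not the hypothesis, exposing it as too narrow.
print(true_rule(negative_test), hypothesis(negative_test))  # True False
```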

    Effects of amount of information on judgment accuracy and confidence

    No full text
    When a person evaluates his or her confidence in a judgment, what is the effect of receiving more judgment-relevant information? We report three studies showing that when judges receive more information, their confidence increases more than their accuracy, producing substantial confidence-accuracy discrepancies. Our results suggest that judges do not adjust for the cognitive limitations that reduce their ability to use additional information effectively. We place these findings in a more general framework of understanding the cues to confidence that judges use and how those cues relate to accuracy and calibration.
    Keywords: judgment; confidence; accuracy; football; overconfidence; calibration
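    The confidence-accuracy discrepancy described above is typically scored as mean confidence minus proportion correct. The sketch below shows that calculation on made-up numbers; they are purely illustrative and are not data from the studies.

```python
# Illustrative scoring of a confidence-accuracy discrepancy (overconfidence).
# The numbers are invented, not results from the studies described above.

low_info = [(0.60, 1), (0.55, 0), (0.65, 1), (0.60, 0)]   # (confidence, correct?)
high_info = [(0.85, 1), (0.80, 0), (0.90, 1), (0.80, 0)]

def overconfidence(judgments):
    """Mean confidence minus proportion correct; positive values indicate overconfidence."""
    mean_conf = sum(c for c, _ in judgments) / len(judgments)
    accuracy = sum(k for _, k in judgments) / len(judgments)
    return mean_conf - accuracy

print(overconfidence(low_info))   # 0.10: modest gap with less information
print(overconfidence(high_info))  # 0.34: confidence rises, accuracy doesn't, so the gap widens
```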